
    The Origins of Computational Mechanics: A Brief Intellectual History and Several Clarifications

    The principal goal of computational mechanics is to define pattern and structure so that the organization of complex systems can be detected and quantified. Computational mechanics developed from efforts in the 1970s and early 1980s to identify strange attractors as the mechanism driving weak fluid turbulence, via the method of reconstructing attractor geometry from measurement time series, and from mid-1980s efforts to estimate equations of motion directly from complex time series. By providing a mathematical and operational definition of structure, it addressed weaknesses of these early approaches to discovering patterns in natural systems. Since then, computational mechanics has led to a range of results, from theoretical physics and nonlinear mathematics to diverse applications: closed-form analysis of Markov and non-Markov stochastic processes (ergodic or nonergodic) and their measures of information and intrinsic computation; complex materials; deterministic chaos and intelligence in Maxwellian demons; quantum compression of classical processes; and the evolution of computation and language. This brief review clarifies several misunderstandings and addresses concerns recently raised regarding early works in the field (1980s). We show that misguided evaluations of the contributions of computational mechanics are groundless and stem from a lack of familiarity with its basic goals and from a failure to consider its historical context. For all practical purposes, its modern methods and results largely supersede the early works. This not only renders recent criticism moot and shows the solid ground on which computational mechanics stands but, most importantly, demonstrates the significant progress achieved over three decades and points to the many intriguing and outstanding challenges in understanding the computational nature of complex dynamic systems.

    Comment: 11 pages, 123 citations; http://csc.ucdavis.edu/~cmg/compmech/pubs/cmr.ht

    Structural Drift: The Population Dynamics of Sequential Learning

    We introduce a theory of sequential causal inference in which learners in a chain estimate a structural model from their upstream teacher and then pass samples from the model to their downstream student. It extends the population dynamics of genetic drift, recasting Kimura's selectively neutral theory as a special case of a generalized drift process using structured populations with memory. We examine the diffusion and fixation properties of several drift processes and propose applications to learning, inference, and evolution. We also demonstrate how the organization of drift process space controls fidelity, facilitates innovations, and leads to information loss in sequential learning with and without memory.

    Comment: 15 pages, 9 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/sdrift.ht
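
    The chain dynamic described in the abstract is easy to simulate. Below is a minimal sketch of the memoryless base case, assuming each learner fits a plain Bernoulli bias to its teacher's samples; in that case the chain reduces to Wright-Fisher-style neutral drift, diffusing until the estimate fixes at 0 or 1. Function names and parameter values are illustrative, not the paper's.

        import random

        def structural_drift_chain(p0=0.5, n_samples=100, max_generations=10_000, seed=1):
            """Pass a Bernoulli model down a chain of learners.

            Each learner sees n_samples draws from its teacher's model,
            re-estimates the bias by maximum likelihood, and teaches the
            next learner. Memoryless learners reduce to genetic drift:
            the estimate diffuses until it fixes at 0 or 1.
            """
            rng = random.Random(seed)
            p, trajectory = p0, [p0]
            for _ in range(max_generations):
                # Student observes samples from the teacher's model ...
                ones = sum(rng.random() < p for _ in range(n_samples))
                # ... and estimates its own model from them.
                p = ones / n_samples
                trajectory.append(p)
                if p in (0.0, 1.0):  # fixation: the drift process absorbs
                    break
            return trajectory

        traj = structural_drift_chain()
        print(f"fixed at p={traj[-1]} after {len(traj) - 1} generations")

    Giving the learners' models memory (e.g., fitting a Markov chain rather than a single bias) changes the geometry of the drift space, which is where the abstract's questions about fidelity, innovation, and information loss arise.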

    Chaotic Crystallography: How the physics of information reveals structural order in materials

    We review recent progress in applying information- and computation-theoretic measures to describe material structure in ways that transcend previous methods based on exact geometric symmetries. We discuss the necessary theoretical background for this new toolset and show how the new techniques detect and describe novel material properties. We discuss how the approach relates to well-known crystallographic practice and examine how it provides novel interpretations of familiar structures. Throughout, we concentrate on disordered materials that, while important, have received less attention, both theoretically and experimentally, than those with either periodic or aperiodic order.

    Comment: 9 pages, 2 figures, 1 table; http://csc.ucdavis.edu/~cmg/compmech/pubs/ChemOpinion.ht

    Structure or Noise?

    We show how rate-distortion theory provides a mechanism for automated theory building by naturally distinguishing between regularity and randomness. We start from the simple principle that model variables should, as much as possible, render the future and past conditionally independent. From this, we construct an objective function for model making whose extrema embody the trade-off between a model's structural complexity and its predictive power. The solutions correspond to a hierarchy of models that, at each level of complexity, achieve optimal predictive power at minimal cost. In the limit of maximal prediction, the resulting optimal model identifies a process's intrinsic organization by extracting the underlying causal states. In this limit, the model's complexity is given by the statistical complexity, which is known to be minimal for achieving maximum prediction. Examples show how theory building can profit from analyzing a process's causal compressibility, which is reflected in the optimal models' rate-distortion curve: the process's characteristic for optimally balancing structure and noise at different levels of representation.

    Comment: 6 pages, 2 figures; http://cse.ucdavis.edu/~cmg/compmech/pubs/son.htm
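
    In symbols, an objective of the kind the abstract describes can be sketched in information-bottleneck form; the notation below is ours, and the paper's exact functional may differ. The model variable R should compress the past X^- while retaining information about the future X^+:

        % R: model variable; X^-: past; X^+: future (notation assumed here).
        % The multiplier \beta trades structural complexity (coding cost)
        % against predictive power.
        \min_{p(r \mid x^-)} \Big( I[X^-; R] \;-\; \beta \, I[R; X^+] \Big)

    Sweeping \beta traces out the hierarchy of models the abstract mentions and, as it states, in the maximal-prediction limit the optimal R recovers the causal states, with I[X^-; R] equal to the statistical complexity.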

    Informational and Causal Architecture of Discrete-Time Renewal Processes

    Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use these formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model but with an infinite-state epsilon-machine presentation. All in all, the results lay the groundwork for analyzing processes with infinite statistical complexity and infinite excess entropy.

    Comment: 18 pages, 9 figures, 1 table; http://csc.ucdavis.edu/~cmg/compmech/pubs/dtrp.ht
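
    For concreteness, here is a minimal sketch of a two-state nonunifilar hidden Markov model in the spirit of the Simple Nonunifilar Source named above; the transition probabilities are placeholders, not the paper's parametrization. The defining feature is that from state A the symbol 0 is emitted on transitions to both A and B, so the output alone never pins down the hidden state, and the observer's belief over hidden states (what the epsilon-machine presentation tracks) takes infinitely many values.

        import random

        # state -> list of (probability, emitted symbol, next state).
        # Illustrative parameters only. From "A", symbol 0 can lead to
        # either A or B -- the nonunifilar step.
        TRANSITIONS = {
            "A": [(0.5, 0, "A"), (0.5, 0, "B")],
            "B": [(0.5, 0, "B"), (0.5, 1, "A")],
        }

        def generate(n, state="A", seed=0):
            """Emit n symbols while tracking the hidden state internally."""
            rng = random.Random(seed)
            symbols = []
            for _ in range(n):
                r, cum = rng.random(), 0.0
                for prob, symbol, nxt in TRANSITIONS[state]:
                    cum += prob
                    if r < cum:
                        symbols.append(symbol)
                        state = nxt
                        break
            return symbols

        # The 1s are the renewal events; runs of 0s are the quiescence.
        print("".join(str(s) for s in generate(60)))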

    Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning

    We show that the way in which the Shannon entropy of sequences produced by an information source converges to the source's entropy rate can be used to monitor how an intelligent agent builds and effectively uses a predictive model of its environment. We introduce natural measures of the environment's apparent memory and the amounts of information that must be (i) extracted from observations for an agent to synchronize to the environment and (ii) stored by an agent for optimal prediction. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability.

    Comment: 6 pages, 5 figures, Santa Fe Institute Working Paper 01-03-020; http://www.santafe.edu/projects/CompMech/papers/stte.htm
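
    The convergence the abstract exploits is straightforward to estimate from data. The sketch below computes empirical block entropies H(L) and their differences h(L) = H(L) - H(L-1), which approach the entropy rate from above; the remaining gap at each L is structure the agent has not yet modeled. The environment here is a stand-in of our choosing: a two-state Markov source whose single-symbol statistics hide its memory.

        import math
        import random
        from collections import Counter

        def block_entropy(seq, L):
            """Shannon entropy (bits) of the empirical length-L block distribution."""
            counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
            total = sum(counts.values())
            return -sum(c / total * math.log2(c / total) for c in counts.values())

        # Illustrative environment: no two 1s in a row, fair coin otherwise.
        rng = random.Random(42)
        seq, prev = [], 0
        for _ in range(200_000):
            prev = 0 if prev == 1 else (1 if rng.random() < 0.5 else 0)
            seq.append(prev)

        # h(L) falls from about 0.918 bits at L=1 to the true entropy
        # rate of about 2/3 bit per symbol once the one-step memory is seen.
        H = [0.0] + [block_entropy(seq, L) for L in range(1, 8)]
        for L in range(1, 8):
            print(f"L={L}  H(L)={H[L]:.4f}  h(L)={H[L] - H[L - 1]:.4f}")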

    Signatures of Infinity: Nonergodicity and Resource Scaling in Prediction, Complexity, and Learning

    We introduce a simple analysis of the structural complexity of infinite-memory processes built from random samples of stationary, ergodic finite-memory component processes. Such processes are familiar from the well-known multi-armed bandit problem. We contrast our analysis with computation-theoretic and statistical inference approaches to understanding their complexity. The result is an alternative view of the relationship between predictability, complexity, and learning that highlights the distinct ways in which informational and correlational divergences arise in complex ergodic and nonergodic processes. We draw out consequences for the resource divergences that delineate the structural hierarchy of ergodic processes and for processes that are themselves hierarchical.

    Comment: 8 pages, 1 figure; http://csc.ucdavis.edu/~cmg/compmech/pubs/soi.pd
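
    One concrete instance of this construction, sketched below with illustrative parameters: draw a component process at random once (here, a coin bias b), then run that component forever. Every realization is an ordinary ergodic Bernoulli process, but the mixture is nonergodic: each run's time average converges to its own b rather than to the ensemble mean, so no single realization reveals the ensemble statistics.

        import random

        def nonergodic_run(n, rng):
            """One realization: choose an ergodic component at random,
            then sample it. The frozen choice of b is the nonergodic mode."""
            b = rng.random()  # component selected once per realization
            ones = sum(rng.random() < b for _ in range(n))
            return b, ones / n

        rng = random.Random(7)
        for _ in range(5):
            b, time_avg = nonergodic_run(100_000, rng)
            # Time averages lock onto each run's own b, not the ensemble
            # mean of 1/2 -- the signature of nonergodicity.
            print(f"component bias b={b:.3f}  time average={time_avg:.3f}")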